Deep Learning Exam

Perceptron

The symbol ŷ represents the prediction.
ŷ = step(w1·x1 + w2·x2 + c) is the function that defines the line of the perceptron classifier (the line itself is the set of points where w1·x1 + w2·x2 + c = 0).

Example

The formula for a 2-word classifier is ŷ = step(w1·x1 + w2·x2 + c), where:

  • w1 is the weight of word 1 (10 in this case).
  • w2 is the weight of word 2 (30 in this case).
  • c is the bias (20 in this case).
Activation Function "step"

In this instance, a positive result would be turned to 1 by the activation function, and a negative one to 0.
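The prediction described above can be sketched in a few lines of NumPy (a minimal sketch; the function names and the test inputs are illustrative, the weights and bias are the ones from the example):

```python
import numpy as np

def step(z):
    """Step activation: positive scores become 1, the rest become 0."""
    return 1 if z > 0 else 0

def predict(weights, bias, x):
    """Perceptron prediction: step(w . x + c)."""
    return step(np.dot(weights, x) + bias)

# Weights and bias from the example: w1 = 10, w2 = 30, c = 20.
w = np.array([10.0, 30.0])
c = 20.0
print(predict(w, c, np.array([1.0, 0.0])))    # score = 30  -> 1
print(predict(w, c, np.array([-1.0, -1.0])))  # score = -20 -> 0
```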

The perceptron trick

We are talking about a method to train the perceptron.
There are 2 cases:

  • If a point is correctly classified: do nothing.
  • If a point is incorrectly classified, move the line towards that point by changing the weights and bias.
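The two cases above can be sketched as a single update step (a minimal sketch; the function name and the learning rate are assumptions, not from the original notes):

```python
import numpy as np

def perceptron_trick_step(weights, bias, x, label, lr=0.1):
    """One step of the perceptron trick on a single point x."""
    pred = 1 if np.dot(weights, x) + bias > 0 else 0
    if pred == label:
        return weights, bias  # correctly classified: do nothing
    if label == 1:
        # Predicted 0 but should be 1: move the line toward the point.
        return weights + lr * x, bias + lr
    # Predicted 1 but should be 0: move the line away in the same manner.
    return weights - lr * x, bias - lr

# A misclassified positive point nudges the weights and bias up.
w, b = perceptron_trick_step(np.array([-1.0, -1.0]), 0.0,
                             np.array([1.0, 1.0]), label=1)
print(w, b)  # [-0.9 -0.9] 0.1
```

Repeating this step over all points for several epochs is the full training loop.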

Example



Word2Vec

We represent each word in the dictionary with a vector (the number of dimensions is a design choice).

Notation
  • The expression v − u computes the vector difference between v and u, resulting in a new vector that points from u to v.
  • ‖v‖ denotes the "norm" of a vector v, AKA the vector's length.
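Both notions map directly onto NumPy (a minimal sketch; the vectors u and v are hypothetical, 3-dimensional for illustration):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([4.0, 6.0, 2.0])

diff = v - u                 # the vector pointing from u to v
length = np.linalg.norm(u)   # the norm (length) of u

print(diff)    # [3. 4. 0.]
print(length)  # sqrt(1 + 4 + 4) = 3.0
```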

Cosine Similarity

Another way to measure how similar two vectors are, but this time based on the angle between them rather than their lengths: cos(θ) = (u · v) / (‖u‖ · ‖v‖).

Example
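As a minimal sketch (the vectors are hypothetical), note that scaling a vector leaves the similarity unchanged, since only the angle matters:

```python
import numpy as np

def cosine_similarity(u, v):
    """cos(theta) = (u . v) / (||u|| * ||v||)."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

print(cosine_similarity(u, v))       # ~0.707 (45-degree angle)
print(cosine_similarity(u, 10 * v))  # same value: scaling doesn't change the angle
```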

How to find the embeddings

We build a neural network to solve the following problem: given a word, predict which words are likely to appear near it in a text.

The architecture

The following is the architecture for skip-gram: both the first (input) layer and the last (output) layer have one neuron per word in the vocabulary, and the size of the hidden layer is the embedding dimension.
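A forward pass through that architecture can be sketched as follows (a minimal sketch: the vocabulary size, embedding dimension, and random weights are all hypothetical placeholders):

```python
import numpy as np

vocab_size = 10000   # one input neuron and one output neuron per word
embed_dim = 100      # hidden layer size = embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(size=(vocab_size, embed_dim))   # input -> hidden weights
W_out = rng.normal(size=(embed_dim, vocab_size))  # hidden -> output weights

def forward(word_index):
    """Given a word (as a vocabulary index), score every word as its context."""
    h = W_in[word_index]   # a one-hot input just selects a row: the embedding
    scores = h @ W_out     # one score per vocabulary word
    exp = np.exp(scores - scores.max())
    return exp / exp.sum() # softmax: probability of each word being context

probs = forward(42)
print(probs.shape)  # (10000,)
```

After training, the rows of W_in are the word embeddings we are after.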

On what data do we train?

We slide a window over a text: word pairs that appear together inside the window are labeled 1, and pairs with words that don't appear in the window are labeled 0.
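Generating the positive (label-1) pairs with a sliding window can be sketched like this (a minimal sketch; the function name and window size are assumptions, and sampling the label-0 pairs is left out):

```python
def training_pairs(tokens, window=2):
    """Slide a window over the text; every (center, context) pair
    that co-occurs inside the window gets label 1."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j], 1))
    return pairs

text = "the quick brown fox jumps".split()
for pair in training_pairs(text, window=1):
    print(pair)  # ('the', 'quick', 1), ('quick', 'the', 1), ...
```

The label-0 pairs would be built by pairing each center word with words sampled from outside its window.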

Lil problem

The actual training [WIP]


Backpropagation algorithm